
    Logistic regression model for identifying factors affecting hospitalization of children with pneumonia

    Pneumonia is a lung infection that can occur in babies, children, adults, and older people; however, it is more serious in infants and older adults. Several studies have found that infants are more likely to get pneumonia if they live in low-income families. This study aimed to identify factors that cause children to be hospitalized for pneumonia. Binary logistic regression analysis was performed to build a full model regardless of the significance of the variables, and the forward selection approach was then used to select the significant variables. The age of the mother, smoking by the mother during pregnancy, the duration (in months) the child spent on solid food, and the child's age at the time of pneumonia were found to be significant, with p-values of 0.0009, 0.0010, 0.0003, and less than 0.0001, respectively. The corresponding odds ratios are 0.69, 6.22, 0.40, and 0.60.
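
    A minimal Python sketch (not the authors' code) of the kind of analysis described above: fit a binary logistic regression and read off p-values and odds ratios. The file name and column names (mother_age, smoked_in_pregnancy, months_on_solid_food, age_at_pneumonia, hospitalized) are hypothetical placeholders, and the forward selection step is omitted for brevity.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # hypothetical dataset; one row per child, hospitalized = 1/0
        df = pd.read_csv("pneumonia.csv")
        X = sm.add_constant(df[["mother_age", "smoked_in_pregnancy",
                                "months_on_solid_food", "age_at_pneumonia"]])
        y = df["hospitalized"]

        model = sm.Logit(y, X).fit()
        print(model.summary())         # coefficients with p-values
        print(np.exp(model.params))    # odds ratios (OR < 1 suggests a protective factor)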

    Comparing Outlier Detection Methods using Boxplot Generalized Extreme Studentized Deviate and Sequential Fences

    Outlier identification is essential in data analysis, since outliers can distort inferential statistics. This study aimed to compare the performance of the Boxplot, Generalized Extreme Studentized Deviate (Generalized ESD), and Sequential Fences methods in identifying outliers. A published dataset was used in the study; based on preliminary outlier identification, the data did not contain outliers. Each outlier detection method's performance was evaluated by contaminating the original data with a few outliers. The contaminations were conducted by replacing the two smallest and two largest observations with outliers. The analysis was conducted using SAS version 9.2 for both the original and contaminated data. We found that the Sequential Fences method showed outstanding performance in identifying outliers compared to the Boxplot and Generalized ESD methods.
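
    As a brief illustration of two of the compared methods (a Python sketch, not the paper's SAS code), Tukey's boxplot fences and Rosner's Generalized ESD test can be written as follows.

        import numpy as np
        from scipy import stats

        def boxplot_outliers(x, k=1.5):
            """Flag points outside the Tukey fences Q1 - k*IQR and Q3 + k*IQR."""
            q1, q3 = np.percentile(x, [25, 75])
            iqr = q3 - q1
            return (x < q1 - k * iqr) | (x > q3 + k * iqr)

        def generalized_esd(x, max_outliers=4, alpha=0.05):
            """Rosner's Generalized ESD test: return indices of detected outliers."""
            x = np.asarray(x, dtype=float)
            idx = np.arange(len(x))
            removed, n_signif = [], 0
            for i in range(1, max_outliers + 1):
                n = len(x)
                r = np.abs(x - x.mean()) / x.std(ddof=1)   # studentized deviations
                j = int(np.argmax(r))
                t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
                lam = (n - 1) * t / np.sqrt((n - 2 + t**2) * n)  # critical value
                if r[j] > lam:
                    n_signif = i       # outlier count = largest i with R_i > lambda_i
                removed.append(idx[j])
                x, idx = np.delete(x, j), np.delete(idx, j)
            return removed[:n_signif]

        data = np.array([2.1, 2.2, 2.3, 2.3, 2.4, 2.5, 9.9])  # one planted outlier
        print(boxplot_outliers(data).nonzero()[0], generalized_esd(data))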

    Determination of flexibility of workers working time through Taguchi method approach

    The human factor is one of the most important elements in the manufacturing world; despite its important role in improving production flow, it has been neglected in scheduling for many decades. In this paper, the researchers take the human factor, through a job-performance weighting, into consideration while applying job shop scheduling (JSS) to a glass factory, in order to improve workers' flexibility. In addition, the researchers suggest a new sequence of workers' weights, obtained using the Taguchi method, which yields the best flexibility the workers can have while decreasing the total time the factory needs to complete the whole production flow.
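
    The paper's experimental layout is not reproduced here; the snippet below is only a loose illustration of the Taguchi idea it invokes, ranking candidate worker-weight sequences by a signal-to-noise ratio. The sequences, makespan values, and the smaller-is-better criterion (total completion time should be minimized) are invented for illustration.

        import numpy as np

        def sn_smaller_is_better(y):
            """Taguchi S/N ratio for a response to be minimized (e.g. makespan)."""
            y = np.asarray(y, dtype=float)
            return -10 * np.log10(np.mean(y**2))

        # makespans (hours) observed for three hypothetical weight sequences
        trials = {
            "sequence_A": [41.2, 40.8, 42.0],
            "sequence_B": [38.9, 39.4, 39.1],
            "sequence_C": [40.1, 43.3, 39.7],
        }
        for name, y in trials.items():
            print(name, round(sn_smaller_is_better(y), 2))  # highest S/N wins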

    An overview of multi-filters for eliminating impulse noise for digital images

    An image obtained through the digitization process is referred to as a digital image. The quality of a digital image may degrade due to interference during acquisition, transmission, extraction, etc. This has attracted the attention of many researchers to study the causes of damage to the information in the image. In addition to finding the causes of image damage, researchers are also looking for ways to overcome this problem. Many filtering techniques have been introduced to deal with the damage to the information in the image. Besides eliminating noise from the image, filtering techniques also aim to maintain the originality of the features in the image. Among the many research papers on image filtering, there is a lack of review papers, which are important for helping researchers understand the differences between filtering techniques. Additionally, such reviews help researchers determine the direction of their research based on the results of previous work. Therefore, this paper presents a review of several filtering techniques that have been developed so far.
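
    As a concrete example of the simplest family of reviewed techniques, the sketch below applies a median filter, a standard remedy for impulse (salt-and-pepper) noise; the test image and noise level are arbitrary.

        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(0)
        image = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)

        # inject salt-and-pepper (impulse) noise on 5% of the pixels
        mask = rng.random(image.shape) < 0.05
        noisy = image.copy()
        noisy[mask] = rng.choice([0, 255], size=mask.sum())

        # replace each pixel with the median of its 3x3 neighbourhood
        denoised = median_filter(noisy, size=3)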

    A Comparative Study on Whole Body Vibration (WBV) Comfort towards Compact Car Model through Data Mining Approach

    Nowadays, Malaysians spend a significant amount of time traveling by vehicle from one location to another, which makes minimizing vibration essential for comfort in transportation. The vibration generated while driving can induce stress and reduce the focus of the driver and passengers, and this is one of the main causes of road accidents. In this study, we investigate the effect of the vibration caused by tire interaction with the road surface. The methodology focuses on the trends in vibration exposure generated throughout the engine operating rpm range in both stationary and non-stationary conditions. The analysis derives an equation to identify the significant data used in the K-Means algorithm. Based on the trends of the experienced and exposed vibration, the model is able to differentiate the level of comfort between the clusters by grouping the level of vibration into five categories. To verify the accuracy of the cluster classification, the K-Nearest Neighbor method and Linear Discriminant Analysis are used to report the percentage accuracy of classifying the clustered data. Finally, the vibration of the three cars in this study is analyzed and compared using analysis of variance (ANOVA).
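
    A rough sketch of the described pipeline, under the assumption of placeholder vibration features (the paper's actual feature construction is not reproduced): cluster exposure into five comfort levels with K-Means, then check how well K-Nearest Neighbors and Linear Discriminant Analysis reproduce the cluster labels.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 4))  # placeholder vibration features (e.g. RMS per axis)

        # group vibration exposure into five comfort-level clusters
        labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

        # verify how reliably classifiers recover the cluster assignment
        for clf in (KNeighborsClassifier(n_neighbors=5), LinearDiscriminantAnalysis()):
            acc = cross_val_score(clf, X, labels, cv=5).mean()
            print(type(clf).__name__, f"accuracy = {acc:.2%}")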

    Leukaemia’s Cells Pattern Tracking Via Multi-phases Edge Detection Techniques

    Edge detection involves identifying and tracing sudden sharp discontinuities to extract meaningful information from an image. The purpose of this paper is to improve the detection of leukaemia edges in blood cell images. Toward this end, two distinct procedures are developed: the Ant Colony Optimization (ACO) algorithm and the gradient edge detectors (Sobel, Prewitt, and Roberts). The latter involves image filtering, binarization, kernel convolution filtering, and image transformation, while ACO involves filtering, enhancement, detection, and localisation of the edges. Finally, the performance of the ACO, Sobel, Prewitt, and Roberts edge detection methods is compared to determine the best method. The results revealed that the Prewitt edge detection method produced the optimal performance for detecting edges of leukaemia cells, with a value of 107%, while ACO, Sobel, and Roberts yielded performance results of 76%, 102%, and 93%, respectively. The overall findings indicate that the gradient edge detection methods are superior to the Ant Colony Optimization method.
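
    The gradient detectors compared above can be sketched compactly as 2-D convolutions; ACO edge detection is omitted for brevity, and the random test image is a stand-in for a grayscale blood-cell image.

        import numpy as np
        from scipy import ndimage

        def gradient_edges(img, kx, ky):
            """Convolve with a kernel pair and return the gradient magnitude."""
            gx = ndimage.convolve(img.astype(float), kx)
            gy = ndimage.convolve(img.astype(float), ky)
            return np.hypot(gx, gy)

        sobel_x   = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
        prewitt_x = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]])
        roberts_x = np.array([[1, 0], [0, -1]])
        roberts_y = np.array([[0, 1], [-1, 0]])

        img = np.random.default_rng(2).random((64, 64))  # stand-in image
        for name, (kx, ky) in {"Sobel": (sobel_x, sobel_x.T),
                               "Prewitt": (prewitt_x, prewitt_x.T),
                               "Roberts": (roberts_x, roberts_y)}.items():
            edges = gradient_edges(img, kx, ky)
            binary = edges > edges.mean() + edges.std()  # simple global threshold
            print(name, "edge pixels:", int(binary.sum()))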

    Ensemble learning with imbalanced data handling in the early detection of capital markets

    Research aims: This study aims to create an early detection model to predict events in the Indonesian capital market. Design/Methodology/Approach: A quantitative study comparing ensemble learning models with imbalanced data handling to detect capital market events early. Five ensemble learning models were used—Random Forest, ExtraTrees, CatBoost, XGBoost, and LightGBM—combined with imbalanced-data handling: undersampling (RUS), oversampling (SMOTE, SMOTE-Borderline, ADASYN), over-under sampling (SMOTE-Tomek, SMOTE-ENN), and weighting (class weight). Global and regional stock markets, commodities, exchange rates, technical indicators, sectoral indices, JCI leaders, MSCI, net buys of foreign stocks, national securities, and national share ownership were used to predict the lowest-return Crisis Management Protocol (CMP) binary responses. Research findings: Hyperparameters and thresholds were tuned to produce the optimum model, with the best model having the highest G-mean. ExtraTrees with SMOTE-ENN best predicted one-day events, with a G-mean of 96.88%. LightGBM with SMOTE handling best predicted five-day events, with an 89.21% G-mean. With a G-mean of 89.49%, CatBoost with SMOTE-Borderline handling was the best for 15-day events, and LightGBM with SMOTE-Tomek handling, at a 68.02% G-mean, was best for 30-day events. Performance evaluation scores decreased as the prediction horizon increased. Theoretical contribution/Originality: This work applies a broader set of imbalance handling methods and ensemble learners to capital market early detection. Practitioner/Policy implication: Capital markets can indicate economic stability; maintaining capital market efficacy and economic value requires a system to detect pressure. Research limitation/Implication: This study predicted capital market events 1, 5, 15, and 30 days ahead, assuming Indonesian working days. The model's forecasts are expected to be used to monitor the capital market and take precautions.
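
    A schematic sketch of one model/handling pair from the study (ExtraTrees with SMOTE-ENN); the feature matrix and event rate are placeholders, and imbalanced-learn supplies both the resampler and the G-mean score.

        import numpy as np
        from imblearn.combine import SMOTEENN
        from imblearn.metrics import geometric_mean_score
        from sklearn.ensemble import ExtraTreesClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        X = rng.normal(size=(1000, 20))                          # market features (placeholder)
        y = (X[:, 0] + rng.normal(0, 1, 1000) > 2.2).astype(int)  # rare "event" label (~5%)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        # oversample the minority class with SMOTE, then clean with ENN
        X_res, y_res = SMOTEENN(random_state=0).fit_resample(X_tr, y_tr)

        clf = ExtraTreesClassifier(n_estimators=300, random_state=0).fit(X_res, y_res)
        print("G-mean:", geometric_mean_score(y_te, clf.predict(X_te)))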

    Increasing T-method accuracy through application of robust M-estimator

    The Mahalanobis Taguchi System is an analytical tool involving classification, clustering, and prediction techniques. The T-Method, which is part of it, is a multivariate analysis technique designed mainly for prediction and optimization purposes. The good thing about the T-Method is that prediction is always possible, even with a limited sample size. In applying the T-Method, the analyst is advised to clearly understand the trend and state of the data population, since the method handles limited sample sizes well, but larger or extremely large samples raise further considerations. The T-Method is not claimed to be robust to the effect of outliers, so dealing with high-sample data puts the prediction accuracy at risk. Incorporating outliers in the overall data analysis may contribute to a non-normal state in which classical methods break down. Considering this risk, it is important to address the accuracy of the individual estimates so that the overall prediction accuracy is increased. To this end, there exist several robust parameter estimators, such as the M-estimator, that are able to give good results whether or not the data contain outliers. The generalized inverse regression estimator (GIR) was also used in this research, as well as the Ordinary Least Squares (OLS) method, as part of a comparison study. Embedding these methods into the T-Method's individual estimates conditionally helps enhance the accuracy of the T-Method while analyzing the robustness of the T-Method itself. However, across the three main case studies used in this analysis, the T-Method delivered a better and acceptable performance, with error percentages ranging from 2.5% to 22.8% across all cases, compared to the other methods. The M-estimator proved to be sensitive to data containing leverage points on the x-axis as well as to data with limited sample sizes. Referring to these three case studies only, it can be concluded that the robust M-estimator is not feasible to apply to the T-Method as of now. Further analysis is needed to address issues such as the airfoil noise case study, for which the T-Method produced the highest prediction error percentage.
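
    A small sketch contrasting the individual estimates discussed above: OLS versus a robust Huber M-estimator on outlier-contaminated data (via statsmodels). The data are synthetic, and embedding such estimates into the T-Method itself is beyond this snippet.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 10, 50)
        y = 2.0 * x + rng.normal(0, 1, 50)
        y[:3] += 25                                # contaminate with outliers

        X = sm.add_constant(x)
        ols = sm.OLS(y, X).fit()
        huber = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # M-estimation

        print("OLS slope:  ", ols.params[1])       # distorted by the outliers
        print("Huber slope:", huber.params[1])     # closer to the true slope 2.0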

    An overview of the fundamental approaches that yield several image denoising techniques

    A digital image is considered a powerful tool to carry and transmit information between people. Thus, it attracts the attention of a large number of researchers, among them those interested in preserving image features from any factors that may reduce image quality. One of these factors is noise, which affects the visual aspect of the image and makes other image processing tasks more difficult. Thus far, solving this noise problem remains a challenge for researchers in this field. Many image denoising techniques have been introduced in order to remove the noise while taking care of the image features; in other words, obtaining the best similarity to the original image from the noisy one. However, the findings are still inconclusive. Besides the enormous amount of research adopting several mathematical concepts (statistics, probability, modeling, PDEs, wavelets, fuzzy logic, etc.), there is also a scarcity of review papers, which play an important role in the development and progress of research. Thus, this review paper introduces an overview of the different fundamental approaches that yield the several image-denoising techniques, presented with a new classification. Furthermore, the paper presents the different evaluation tools needed for the comparison between these techniques, in order to facilitate work on this noise problem among a great diversity of techniques and concepts.
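
    As an example of the evaluation tools such comparisons rely on, the sketch below computes PSNR and SSIM with scikit-image; the "denoised" image here is a synthetic stand-in result.

        import numpy as np
        from skimage.metrics import peak_signal_noise_ratio, structural_similarity

        rng = np.random.default_rng(5)
        original = rng.random((64, 64))
        denoised = original + rng.normal(0, 0.05, original.shape)  # stand-in result

        # higher PSNR / SSIM = closer to the original image
        print("PSNR:", peak_signal_noise_ratio(original, denoised, data_range=1.0))
        print("SSIM:", structural_similarity(original, denoised, data_range=1.0))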

    Progression approach for image denoising

    Removing noise from an image while retaining the details and features of the treated image remains a standing challenge for researchers in this field. Therefore, this study proposes and implements a new technique for removing impulse noise from digital images. The technique narrows the gap between the original and restored images, both visually and quantitatively, by adopting the mathematical concept of an "arithmetic progression", integrated into image denoising due to its ability to model the variation of pixel intensity in an image. The principle of the proposed denoising technique relies on precision: it keeps uncorrupted pixels by using effective noise detection and converts corrupted pixels by replacing them with the closest pixel values, at lower cost and with greater simplicity.
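
    Since the abstract does not give the algorithm's details, the following is only a loose illustration of its two stages: detect impulse-corrupted pixels, then replace only those pixels from nearby clean values. The extreme-value detection rule and the row-wise arithmetic-mean replacement (the midpoint of an arithmetic progression) are assumptions, not the authors' exact method.

        import numpy as np

        def denoise_impulse(noisy):
            out = noisy.astype(float).copy()
            corrupted = (noisy == 0) | (noisy == 255)   # assumed impulse detector
            rows, cols = np.nonzero(corrupted)
            for r, c in zip(rows, cols):
                # take the nearest clean left/right neighbours in the same row and
                # use their arithmetic mean (midpoint of an arithmetic progression)
                left = right = None
                for cc in range(c - 1, -1, -1):
                    if not corrupted[r, cc]:
                        left = out[r, cc]; break
                for cc in range(c + 1, noisy.shape[1]):
                    if not corrupted[r, cc]:
                        right = out[r, cc]; break
                vals = [v for v in (left, right) if v is not None]
                out[r, c] = np.mean(vals) if vals else 128  # fallback mid-gray
            return out.astype(np.uint8)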